Web Survey Bibliography
Interactive web questionnaires promise to improve survey measurement relative to more static modes, whether online or paper. By designing questionnaires that react to respondent actions associated with reduced data quality, it may be possible to promote behavior that leads to improved quality. We are investigating this type of interactivity by giving feedback to respondents (“speeders”) when they answer so fast they cannot realistically have read the question let alone thought about the answer (“You seem to have responded very quickly. Please be sure you have given the question sufficient thought to provide an accurate answer.”). In prior research (reported at AAPOR, 2009) we observed that, overall, speeders answered questions about quantities (e.g., “Overall, how many overnight trips have you taken in the PAST 2 YEARS?”) more slowly when they were prompted than when they were not. We assume the slowdown reflects improved data quality – they also were less likely to straightline on later grid items – but we cannot be sure without a direct measure of response accuracy. In the current research we explored the relationship between response time and quality by prompting speeders on simple numeracy items for which we could determine response accuracy (e.g., “If the chance of getting a disease is 10%, how many people out of 100 would be expected to get the disease: 1, 10 or 20?”). Half of the respondents were prompted whenever they answered faster than 300 msec per word for each of seven numeracy items; the other respondents were never prompted. Because the prompt is relatively punitive in tone – in effect, chastising respondents for speeding – we also tried to motivate respondents at the outset to be conscientious when answering. Half of the respondents were asked to commit to reading the question carefully and thinking about the answer before submitting it; the other respondents were not asked to commit. 
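The speeding rule described above — flagging a response that arrives faster than 300 msec per word of question text and, in the prompt condition, showing the feedback message — can be sketched as follows. This is a minimal illustration, not the study's actual instrument code; the function and variable names are hypothetical.

```python
from typing import Optional

# Threshold from the study design: responses faster than 300 ms per word
# of question text are treated as speeding.
THRESHOLD_MS_PER_WORD = 300

# Feedback text quoted in the abstract.
PROMPT_TEXT = (
    "You seem to have responded very quickly. Please be sure you have "
    "given the question sufficient thought to provide an accurate answer."
)

def is_speeding(response_time_ms: float, question_text: str) -> bool:
    """True if the answer came faster than the per-word time threshold."""
    word_count = len(question_text.split())
    return response_time_ms < THRESHOLD_MS_PER_WORD * word_count

def maybe_prompt(response_time_ms: float, question_text: str,
                 in_prompt_condition: bool) -> Optional[str]:
    """Return the feedback message only for speeders in the prompt condition."""
    if in_prompt_condition and is_speeding(response_time_ms, question_text):
        return PROMPT_TEXT
    return None
```

For example, a 13-word question yields a threshold of 3,900 ms, so a 1,000 ms answer would trigger the prompt in the prompt condition and pass silently in the control condition.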
By crossing prompting (yes or no) with commitment (yes or no) we produced four experimental conditions to which 2,565 respondents were randomly assigned. As in the earlier studies, respondents in the prompt group answered more slowly overall than those in the no-prompt group. The prompt also increased response accuracy for a subset of respondents with moderate levels of education (Some College/Associate's Degree): these respondents were 5 percentage points more accurate (55% vs. 50%) when they were prompted than when they were not. Those with more education were able to speed while responding accurately, and those with the least education were inaccurate despite answering more slowly. The prompts reduced straightlining in later grid questions across different education levels, suggesting the intervention was taken to heart even when it did not affect the accuracy of answers to the numeracy questions. And as in the earlier studies, the intervention did not increase breakoffs. Commitment had a complementary effect. It slowed responses as much as the prompt did (there were main effects of both prompting and commitment on response time but no interaction) and increased accuracy for respondents with the highest levels of education: those with a Bachelor's Degree improved from 60% to 65%, and those with a Master's Degree or more from 64% to 71%, when they committed to careful responding. Like prompting, commitment did not increase breakoffs. So it may be that both approaches – carrots and sticks – used together can promote more thoughtful and, ultimately, more accurate answers across the pool of respondents. While reliable, the accuracy advantage conferred by prompting and commitment is modest (5 to 6 percentage points). Nonetheless, given concerns about data quality in web surveys – especially when nonprobability panels are used – gains of this size are a welcome improvement, and they may be larger for other types of items for which aptitude is less central.
In future research we will explore interventions for other behaviors besides speeding, e.g., primacy effects, straightlining, and conditioning. The fact that prompting for speeding did not increase breakoffs suggests respondents may tolerate these other kinds of interventions as well.